Search results for: Poisson entropy

Number of results: 98442

Thesis: Ministry of Science, Research and Technology - Isfahan University of Technology - Faculty of Mathematics, 1390

The main objective in sampling is to select a sample from a population in order to estimate some unknown population parameter, usually a total or a mean of some variable of interest. A simple way to take a sample of size n is to let all possible samples have the same probability of being selected. This is called simple random sampling, and then all units have the same probability of being ch...
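
A minimal sketch of the equal-probability selection described above, assuming a made-up population list and sample size (illustrative only, not taken from the thesis):

```python
# Simple random sampling without replacement: every subset of size n is
# equally likely, so every unit has the same inclusion probability n/N.
import random

population = list(range(1, 101))   # hypothetical sampling frame of N = 100 units
n = 10                             # illustrative sample size

sample = random.sample(population, n)                # draw the sample
sample_mean = sum(sample) / n                        # estimates the population mean
expansion_total = len(population) * sample_mean      # estimates the population total

print(sample, sample_mean, expansion_total)
```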

Thesis: Tarbiat Moallem University - Tehran - Faculty of Mathematical Sciences and Computer Engineering, 1386

No abstract available.

The Kolmogorov-Sinai entropy is a far reaching dynamical generalization of Shannon entropy of information systems. This entropy works perfectly for probability measure preserving (p.m.p.) transformations. However, it is not useful when there is no finite invariant measure. There are certain successful extensions of the notion of entropy to infinite measure spaces, or transformations with ...
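
For reference, the Kolmogorov-Sinai entropy referred to above is the usual supremum over finite partitions (standard notation, not taken from the thesis):

```latex
% Kolmogorov-Sinai entropy of a p.m.p. transformation T of (X, \mathcal{B}, \mu):
h_\mu(T) \;=\; \sup_{P \ \text{finite partition}} \;
  \lim_{n \to \infty} \frac{1}{n}\,
  H_\mu\!\left( \bigvee_{i=0}^{n-1} T^{-i} P \right),
\qquad
H_\mu(Q) \;=\; -\sum_{A \in Q} \mu(A) \log \mu(A).
```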

Journal: Annales de l'Institut Henri Poincaré, Probabilités et Statistiques 2012
Igal Sason

The first part of this work considers the entropy of the sum of (possibly dependent and non-identically distributed) Bernoulli random variables. Upper bounds on the error that follows from an approximation of this entropy by the entropy of a Poisson random variable with the same mean are derived via the Chen-Stein method. The second part of this work derives new lower bounds on the total variat...
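
As a rough numerical illustration of the approximation discussed above (the paper allows dependent, non-identically distributed summands; this sketch uses the simplest i.i.d. case with arbitrary parameters): the sum of n independent Bernoulli(p) variables is Binomial(n, p), and its entropy can be compared with that of a Poisson variable with the same mean np.

```python
# Compare the entropy of Binomial(n, p), i.e. a sum of i.i.d. Bernoulli(p)
# variables, with the entropy of Poisson(n*p), which has the same mean.
import math

def binomial_pmf(n, p, k):
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(lam, k):
    # computed in log space to avoid overflow for large k
    return math.exp(-lam + k * math.log(lam) - math.lgamma(k + 1))

def entropy(probs):
    return -sum(q * math.log(q) for q in probs if q > 0)  # in nats

n, p = 50, 0.04                    # arbitrary illustrative parameters
lam = n * p

h_bin = entropy(binomial_pmf(n, p, k) for k in range(n + 1))
h_poi = entropy(poisson_pmf(lam, k) for k in range(100))   # truncated tail

print(f"H(Binomial) = {h_bin:.4f}  H(Poisson) = {h_poi:.4f}  "
      f"gap = {abs(h_bin - h_poi):.4f}")
```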

Journal: CoRR 2012
Igal Sason

The first part of this work considers the entropy of the sum of (possibly dependent and non-identically distributed) Bernoulli random variables. Upper bounds on the error that follows from an approximation of this entropy by the entropy of a Poisson random variable with the same mean are derived via the Chen-Stein method. The second part of this work derives new lower bounds on the total variat...

Journal: Discrete Applied Mathematics 2013
Oliver Johnson, Ioannis Kontoyiannis, Mokshay M. Madiman

Sufficient conditions are developed, under which the compound Poisson distribution has maximal entropy within a natural class of probability measures on the nonnegative integers. Recently, one of the authors [O. Johnson, Stoch. Proc. Appl., 2007] used a semigroup approach to show that the Poisson has maximal entropy among all ultra-log-concave distributions with fixed mean. We show via a non-tr...
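
For orientation, ultra-log-concavity as mentioned above is usually stated as follows (a standard formulation; the notation is chosen here, not copied from the paper):

```latex
% P on \{0,1,2,\dots\} is ultra-log-concave (ULC) if P/\Pi_\lambda is
% log-concave, \Pi_\lambda being the Poisson(\lambda) law; equivalently
k\, P(k)^2 \;\ge\; (k+1)\, P(k+1)\, P(k-1), \qquad k \ge 1.
% The result quoted in the abstract: among ULC laws with fixed mean \lambda,
% the Poisson(\lambda) distribution maximises the entropy.
```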

2008
I. Kontoyiannis, O. T. Johnson, M. Madiman

An information-theoretic foundation for compound Poisson approximation limit theorems is presented, in analogy to the corresponding developments for the central limit theorem and for simple Poisson approximation. It is shown that the compound Poisson distributions satisfy a natural maximum entropy property within a natural class of distributions. Simple compound Poisson approximation bounds are...
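
The compound Poisson law appearing above is, in standard notation (symbols chosen here for illustration), the distribution of a Poisson number of i.i.d. summands:

```latex
% Compound Poisson distribution CP(\lambda, Q):
S \;=\; \sum_{i=1}^{N} Y_i, \qquad N \sim \mathrm{Poisson}(\lambda),\quad
Y_1, Y_2, \ldots \ \text{i.i.d.} \sim Q \ \text{and independent of } N.
```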

2009
Mohammad Rezaeian

We define the Boltzmann-Gibbs entropy of a random finite set on a general space X as the integral of the logarithm of the density function over the space of finite sets of X, where the measure for this integration is the dominating measure on this space. We show that, with a unit adjustment term, the same value of the entropy can be obtained using the calculus of set integrals, which applies integration...
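
A hedged sketch of the set-integral calculus alluded to above, in the notation common to the random-finite-set literature (the paper's exact dominating measure and its unit adjustment term are not reproduced here):

```latex
% Set integral of f over the finite subsets of X, and the resulting entropy
% of a random finite set with density f (a sketch, not the paper's statement):
\int f(X)\,\delta X \;=\; \sum_{n=0}^{\infty} \frac{1}{n!}
   \int_{X^n} f(\{x_1,\ldots,x_n\})\, dx_1 \cdots dx_n,
\qquad
H \;=\; -\int f(X)\,\log f(X)\,\delta X.
```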

2012
Thierry de la Rue

We prove that the notions of Krengel entropy and Poisson entropy for infinite-measure-preserving transformations do not always coincide: we construct a conservative infinite-measure-preserving transformation with zero Krengel entropy (the induced transformation on a set of measure 1 is the von Neumann–Kakutani odometer), but whose associated Poisson suspension has positive entropy.
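
For orientation, the two notions compared above are commonly defined along the following lines (a sketch in standard notation, not quoted from the paper): Krengel entropy is a supremum of scaled entropies of induced maps, and Poisson entropy is the Kolmogorov-Sinai entropy of the associated Poisson suspension.

```latex
% Krengel entropy of a conservative measure-preserving T on an infinite
% sigma-finite space (X, \mu), with T_A the transformation induced on A:
h_{\mathrm{Kr}}(T,\mu) \;=\; \sup_{0 < \mu(A) < \infty}
   \mu(A)\, h_{\mu_A}(T_A), \qquad \mu_A = \frac{\mu(\,\cdot\, \cap A)}{\mu(A)}.
% Poisson entropy: the Kolmogorov-Sinai entropy of the Poisson suspension T_*,
% acting on realisations of the Poisson point process with intensity \mu:
h_{\mathrm{Poisson}}(T,\mu) \;=\; h(T_*).
```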
